Avoiding Catastrophic Forgetting by a Dual-Network Memory Model Using a Chaotic Neural Network
Abstract
In neural networks, when new patterns are learned by a network, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model which overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider that these networks correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. The stored information is then recalled through the chaotic behavior of each neuron in the hippocampal network and is finally consolidated in the neocortical network by means of pseudopatterns. Computer simulation results show that the proposed model avoids catastrophic forgetting far better than conventional models.

Keywords: catastrophic forgetting, chaotic neural network, complementary learning systems, dual-network.
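The following sketch is only an illustration of the two-network arrangement described above, not the authors' implementation: a Hebbian Hopfield-type memory plays the role of the hippocampal network, a small multilayer auto-associator plays the role of the neocortical network, and consolidation uses pseudopatterns obtained by probing the Hopfield memory. Random probes stand in for the paper's chaotic recall, and all class names, network sizes, and learning parameters are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)


    class HopfieldMemory:
        """Hebbian auto-associative memory over bipolar (+1/-1) patterns
        (the 'hippocampal' network of this sketch)."""

        def __init__(self, n_units):
            self.n = n_units
            self.W = np.zeros((n_units, n_units))

        def store(self, patterns):
            # One-shot Hebbian storage: fast learning of the new patterns.
            for p in patterns:
                self.W += np.outer(p, p)
            np.fill_diagonal(self.W, 0.0)

        def recall(self, probe, steps=20):
            # Deterministic synchronous updates until (near) convergence.
            s = probe.copy()
            for _ in range(steps):
                s = np.sign(self.W @ s)
                s[s == 0] = 1.0
            return s

        def pseudopatterns(self, n_samples):
            # Probe the memory with random states; the settled states serve
            # as pseudopatterns (the paper obtains them by chaotic recall).
            probes = rng.choice([-1.0, 1.0], size=(n_samples, self.n))
            return np.array([self.recall(p) for p in probes])


    class NeocorticalNet:
        """Single-hidden-layer auto-associator trained by gradient descent."""

        def __init__(self, n_in, n_hidden, lr=0.05):
            self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
            self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_in))
            self.lr = lr

        def forward(self, x):
            h = np.tanh(x @ self.W1)
            return h, np.tanh(h @ self.W2)

        def train(self, patterns, epochs=200):
            for _ in range(epochs):
                for x in patterns:
                    h, y = self.forward(x)
                    d_out = (y - x) * (1.0 - y ** 2)
                    d_hid = (d_out @ self.W2.T) * (1.0 - h ** 2)
                    self.W2 -= self.lr * np.outer(h, d_out)
                    self.W1 -= self.lr * np.outer(x, d_hid)


    # New patterns are first stored quickly in the hippocampal network,
    # then consolidated in the neocortical network via pseudopatterns.
    hippo = HopfieldMemory(n_units=32)
    hippo.store(rng.choice([-1.0, 1.0], size=(3, 32)))

    cortex = NeocorticalNet(n_in=32, n_hidden=16)
    cortex.train(hippo.pseudopatterns(n_samples=20))

In a sequential-learning setting one would typically also interleave pseudopatterns generated by the neocortical network itself before consolidating a new hippocampal episode, so that earlier consolidated knowledge is rehearsed as well; that refinement is omitted here for brevity.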
Similar Papers
Extraction of Patterns from a Hippocampal Network Using Chaotic Recall
In neural networks, when new patterns are learned by a network, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. We have already proposed a biologically inspired dual-network memory model which can reduce catastrophic interference. Although two distinct networks of the model correspond to the ...
Dual-Network Memory Model for Temporal Sequences
In neural networks, when new patterns are learned by a network, they radically interfere with previously stored patterns. This drawback is called catastrophic forgetting. We have already proposed a biologically inspired dual-network memory model which can greatly reduce this forgetting for static patterns. In this model, information is first stored in the hippocampal network, and thereafter, it is ...
Self-refreshing SOM as a Semantic Memory Model
Natural and artificial cognitive systems suffer from forgetting information. However, in natural systems forgetting is typically gradual whereas in artificial systems forgetting is often catastrophic. Catastrophic forgetting is also a problem for the Self-Organizing Map (SOM) when used as a semantic memory model in a continuous learning task in a nonstationary environment. Methods based on rehe...
AN IMPROVED CONTROLLED CHAOTIC NEURAL NETWORK FOR PATTERN RECOGNITION
A sigmoid function is necessary for creating a chaotic neural network (CNN). In this paper, a new function for the CNN is proposed that can increase the speed of convergence. In the proposed method, we use a novel signal for controlling chaos. Both the theoretical analysis and computer simulation results show that the performance of the CNN can be improved remarkably by using our method. By means of this...
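The excerpt above only notes that a sigmoid output function is needed to obtain chaotic dynamics; it does not give the improved function it proposes. As a rough illustration, the sketch below iterates the standard chaotic neuron dynamics in the style of Aihara et al. with a steep sigmoid output; the parameter values are illustrative assumptions, not values from the cited paper.

    import numpy as np

    def sigmoid(y, eps=0.02):
        # Steep sigmoid output function; eps controls the steepness.
        return 1.0 / (1.0 + np.exp(-y / eps))

    def chaotic_neuron_outputs(steps=200, k=0.7, alpha=1.0, a=0.8, eps=0.02):
        # Iterate the internal state y(t+1) = k*y(t) - alpha*f(y(t)) + a;
        # the output x(t) = f(y(t)) behaves chaotically for suitable
        # combinations of k, alpha and a.
        y, xs = 0.0, []
        for _ in range(steps):
            y = k * y - alpha * sigmoid(y, eps) + a
            xs.append(sigmoid(y, eps))
        return xs

    print(chaotic_neuron_outputs()[:5])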
Neural networks with a self-refreshing memory: knowledge transfer in sequential learning tasks without catastrophic forgetting
We explore a dual-network architecture with self-refreshing memory (Ans and Rousset 1997) which overcomes catastrophic forgetting in sequential learning tasks. Its principle is that new knowledge is learned along with an internally generated activity reflecting the network history. What mainly distinguishes this model from others using pseudorehearsal in feedforward multilayer networks is a rev...
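The self-refreshing principle can be pictured with plain pseudorehearsal (in the style of Robins) rather than the reverberating variant of Ans and Rousset: random inputs are passed through the current network, the resulting input-output pairs approximate its accumulated knowledge, and they are interleaved with the new items during training. The `model.fit` / `model.predict` interface and all sizes below are hypothetical placeholders.

    import numpy as np

    rng = np.random.default_rng(1)

    def make_pseudoitems(model, n_items, n_inputs):
        # Pair random inputs with the model's current outputs; these
        # pseudoitems approximate what the network already knows.
        x = rng.uniform(0.0, 1.0, size=(n_items, n_inputs))
        return x, model.predict(x)  # hypothetical predict(); assumed 2-D output

    def learn_with_pseudorehearsal(model, new_x, new_y, n_pseudo=64):
        # new_x, new_y are assumed 2-D: (n_items, n_inputs) / (n_items, n_outputs).
        px, py = make_pseudoitems(model, n_pseudo, new_x.shape[1])
        # Interleave the new items with pseudoitems so old mappings are
        # rehearsed while the new ones are being learned.
        model.fit(np.vstack([new_x, px]), np.vstack([new_y, py]))  # hypothetical fit()
        return model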
Publication date: 2010